AURORA: Navigating UI Tarpits via Automated Neural Screen Understanding

Khan, Safwat Ali, Wang, Wenyu, Ren, Yiran, Zhu, Bin, Shi, Jiangfan, McGowan, Alyssa, Lam, Wing, Moran, Kevin

arXiv.org Artificial Intelligence

Nearly a decade of research in software engineering has focused on automating mobile app testing, helping engineers overcome the unique challenges of the platform. Much of this work has come in the form of Automated Input Generation (AIG) tools that dynamically explore app screens. However, such tools have repeatedly been demonstrated to achieve lower-than-expected code coverage, particularly on sophisticated proprietary apps. Prior work has illustrated that a primary cause of these coverage deficiencies is so-called tarpits, or complex screens that are difficult to navigate. In this paper, we take a critical step toward enabling AIG tools to effectively navigate tarpits during app exploration through a new form of automated semantic screen understanding. We introduce AURORA, a technique that learns from the visual and textual patterns that exist in mobile app UIs to automatically detect common screen designs and navigate them accordingly. The key idea of AURORA is that there are a finite number of mobile app screen designs, albeit with subtle variations, such that the general patterns of different categories of UI designs can be learned. As such, AURORA employs a multi-modal, neural screen classifier that is able to recognize the most common types of UI screen designs. After recognizing a given screen, it then applies a set of flexible and generalizable heuristics to properly navigate the screen. We evaluated AURORA both on a set of 12 apps with known tarpits from prior work, and on a new set of five of the most popular apps from the Google Play store. Our results indicate that AURORA is able to effectively navigate tarpit screens, outperforming prior approaches that avoid tarpits by 19.6% in terms of method coverage. The improvements can be attributed to AURORA's UI design classification and heuristic navigation techniques.
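The core idea of classifying a screen's design category from its textual and structural signals can be illustrated with a minimal sketch. This is not AURORA's actual model (the paper describes a multi-modal neural classifier); the category names and keyword profiles below are purely illustrative stand-ins:

```python
# Hedged sketch, not AURORA's real implementation: approximate screen-design
# classification using keyword profiles over widget text. AURORA uses a
# multi-modal neural classifier; the categories and keywords here are invented
# for illustration only.

from collections import Counter

# Hypothetical screen-design categories and textual cues suggesting them.
CATEGORY_PROFILES = {
    "login":         {"password", "sign", "email", "username"},
    "permission":    {"allow", "deny", "access", "permission"},
    "advertisement": {"ad", "sponsored", "close", "skip"},
}

def classify_screen(ui_texts):
    """Return the best-matching category and its score, given the text
    strings extracted from a screen's UI widgets."""
    words = Counter(
        w.strip("?!.,:").lower() for t in ui_texts for w in t.split()
    )
    scores = {
        cat: sum(words[k] for k in keywords)
        for cat, keywords in CATEGORY_PROFILES.items()
    }
    best = max(scores, key=scores.get)
    return best, scores[best]

# Example: a typical sign-in screen.
screen = ["Sign in", "Email address", "Password", "Forgot password?"]
print(classify_screen(screen))  # ('login', 4)
```

Once a screen is classified, a tool could dispatch to category-specific navigation heuristics (e.g., filling credential fields on a login screen) rather than exploring blindly.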


Autonomous Catheterization with Open-source Simulator and Expert Trajectory

Jianu, Tudor, Huang, Baoru, Vo, Tuan, Vu, Minh Nhat, Kang, Jingxuan, Nguyen, Hoan, Omisore, Olatunji, Berthet-Rayne, Pierre, Fichera, Sebastiano, Nguyen, Anh

arXiv.org Artificial Intelligence

Endovascular robots have been actively developed in both academia and industry. However, progress toward autonomous catheterization is often hampered by the widespread use of closed-source simulators and physical phantoms. Additionally, the acquisition of large-scale datasets for training machine learning algorithms with endovascular robots is usually infeasible due to expensive medical procedures. In this chapter, we introduce CathSim, the first open-source simulator for endovascular intervention to address these limitations. CathSim emphasizes real-time performance to enable rapid development and testing of learning algorithms. We validate CathSim against the real robot and show that our simulator can successfully mimic its behavior. Based on CathSim, we develop a multimodal expert navigation network and demonstrate its effectiveness in downstream endovascular navigation tasks. Extensive experimental results suggest that CathSim has the potential to significantly accelerate research in the autonomous catheterization field. Our project is publicly available at https://github.com/airvlab/cathsim. Endovascular interventions are commonly performed for the diagnosis and treatment of vascular diseases. These interventions involve flexible tools, namely guidewires and catheters, which are introduced into the body via small incisions and manually navigated to specific body regions through the vascular system [69]. Endovascular tool navigation takes approximately 70% of the intervention time and is used for a wide range of vascular conditions such as peripheral artery disease, aneurysms, and stenosis [49].
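The kind of simulator-driven training loop CathSim enables can be sketched with a toy example. This is not CathSim's real API; the environment, observation layout, and the proportional "expert" controller below are all illustrative stand-ins for the simulator and the learned expert navigation network:

```python
# Hedged sketch, not CathSim's actual interface: a gym-style interaction loop
# for navigation policy training. The 2-D "vessel" environment and the
# proportional expert policy are invented for illustration.

import math

class ToyVesselEnv:
    """Toy 2-D environment: steer a guidewire tip toward a target position."""

    def __init__(self, target=(1.0, 1.0), step_size=0.1):
        self.target, self.step_size = target, step_size
        self.tip = (0.0, 0.0)

    def reset(self):
        self.tip = (0.0, 0.0)
        return self._obs()

    def _obs(self):
        # Observation: tip position plus the vector to the target.
        (x, y), (tx, ty) = self.tip, self.target
        return (x, y, tx - x, ty - y)

    def step(self, action):
        ax, ay = action
        # Clamp the action to the maximum per-step tip displacement.
        norm = math.hypot(ax, ay) or 1.0
        scale = min(1.0, self.step_size / norm)
        self.tip = (self.tip[0] + ax * scale, self.tip[1] + ay * scale)
        dist = math.hypot(self.target[0] - self.tip[0],
                          self.target[1] - self.tip[1])
        # Negative distance as reward; done when the tip reaches the target.
        return self._obs(), -dist, dist < 0.05

def expert_policy(obs):
    # Proportional controller standing in for the learned expert network.
    _, _, dx, dy = obs
    return (dx, dy)

env = ToyVesselEnv()
obs, done = env.reset(), False
for _ in range(50):
    obs, reward, done = env.step(expert_policy(obs))
    if done:
        break
print("reached target:", done)
```

Trajectories collected this way from an expert controller are what make imitation-learning-style training of navigation networks feasible without costly physical procedures.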


Alaska Schools Get Faster Internet--Partly Thanks to Global Warming

WIRED

Before they got down to business for the day, students in Devin Tatro's social studies class were offered a quiet moment of self-reflection: On this golden fall afternoon at Nome-Beltz Junior/Senior High School, were they feeling chipper, distressed, or somewhere in between? One by one, they selected the picture of the facial expression that best matched their mood, and with a swift click sent an answer to the teacher. She scanned the responses and made a few mental notes. Then, without missing a beat, she switched the smartboard display and launched into a multiple-choice quiz using a game-based online learning platform called Kahoot! "Tell me one thing you remember about yesterday's lesson on expansions and tax on Native Americans," Tatro said, pacing the front of the classroom. She rattled off students' responses as they popped up on the smartboard in a colorful word cloud: "Forced relocation, reduced population, disease, warfare, cultural destruction ... wow, that's a powerful term."


Drones in Hollywood: What Industry Is Next?

AITopics Original Links

This article is by Sean Varah, founder and chief executive of MotionDSP, a company that makes advanced image processing and video analytics software. Last month the Federal Aviation Administration made a decision that marks a significant step for the commercial drone industry, granting six movie and television production companies the right to use drones. This is the first time the FAA has allowed this type of industry exemption from the rules that prohibit drones from flying in U.S. airspace. Despite Congress' request that it develop standards in support of safe drone use by September 2015, and despite corporate America's campaigning for drone operations, the FAA has been dragging its feet. Thanks to Hollywood and the broader entertainment industry, a door has been opened for commercial drones.